Determination of measurement uncertainty by Monte Carlo simulation
Modern coordinate measurement machines (CMM) are universal tools to measure
geometric features of complex three-dimensional workpieces. To use them as
reliable means of quality control, the suitability of the device for the
specific measurement task has to be proven. Therefore, the ISO 14253 standard
requires knowledge of the measurement uncertainty and that it be in
reasonable relation to the specified tolerances. Hence, the determination of
the measurement uncertainty, which is a complex and also costly task, is of
utmost importance. The measurement uncertainty is usually influenced by several
contributions from various sources. Among those of the machine itself,
guideway errors and the influence of the probe and styli play an important
role. Furthermore, several properties of the workpiece, such as its form
deviations and surface roughness, have to be considered. Also the environmental
conditions, i.e., temperature and its gradients, pressure, relative humidity
and others contribute to the overall measurement uncertainty. Currently, there
are different approaches to determine task-specific measurement uncertainties.
This work reports on recent advancements extending the well-established method
of PTB's Virtual Coordinate Measuring Machine (VCMM) to suit present-day needs
in industrial applications. The VCMM utilizes numerical simulations to
determine the task-specific measurement uncertainty incorporating broad
knowledge about the contributions of, e.g., the used CMM, the environment and
the workpiece.
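The Monte Carlo approach behind such a virtual CMM can be illustrated with a minimal sketch: draw each error contribution from an assumed distribution, combine them in a simple measurement model, and take the standard deviation of the simulated results as the standard uncertainty. The error model below (nominal length, thermal expansion, probing and guideway errors, and all distribution parameters) is purely illustrative and not taken from the VCMM.

```python
import random
import statistics

def simulate_measurement(n_runs=100_000, seed=42):
    """Monte Carlo sketch of a task-specific measurement uncertainty.

    Hypothetical model: a 100 mm feature measured on a CMM, with a thermal
    expansion term, a Gaussian probing error, and a uniform guideway error.
    """
    rng = random.Random(seed)
    nominal = 100.0      # mm, assumed nominal feature length
    alpha = 11.5e-6      # 1/K, thermal expansion coefficient of steel
    results = []
    for _ in range(n_runs):
        d_temp = rng.gauss(0.0, 0.5)               # K, deviation from 20 degC
        probe_err = rng.gauss(0.0, 0.3e-3)         # mm, probing error
        guide_err = rng.uniform(-0.2e-3, 0.2e-3)   # mm, guideway error
        results.append(nominal * (1 + alpha * d_temp) + probe_err + guide_err)
    mean = statistics.fmean(results)
    u = statistics.stdev(results)  # standard uncertainty of the result
    return mean, u
```

In a realistic virtual CMM the error model is far richer (correlated guideway errors, styli geometry, workpiece form deviations), but the principle is the same: the spread of the simulated results yields the task-specific uncertainty.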
Investigating the Potential of the Inter-IXP Multigraph for the Provisioning of Guaranteed End-to-End Services
In this work, we propose utilizing the rich connectivity between IXPs and
ISPs for inter-domain path stitching, supervised by centralized QoS brokers. In
this context, we highlight a novel abstraction of the Internet topology, i.e.,
the inter-IXP multigraph composed of IXPs and paths crossing the domains of
their shared member ISPs. This can potentially serve as a dense Internet-wide
substrate for provisioning guaranteed end-to-end (e2e) services with high path
diversity and global IPv4 address space reach. We thus map the IXP multigraph,
evaluate its potential, and introduce a rich algorithmic framework for path
stitching on such graph structures.
Comment: Proceedings of ACM SIGMETRICS '15, pages 429-430, 2015.
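The path-stitching idea can be sketched as a shortest-path search over a multigraph whose nodes are IXPs and whose parallel edges are candidate segments through shared member ISPs. The IXP/ISP names, the toy topology, and the latency metric below are illustrative assumptions, not data from the paper.

```python
from heapq import heappush, heappop

def shortest_stitched_path(edges, src, dst):
    """Dijkstra over a multigraph given as (ixp_a, ixp_b, isp, cost) tuples.

    Parallel edges between the same IXP pair (one per shared member ISP)
    are kept separate, so the search can pick the cheapest segment.
    Returns (total_cost, path) where path alternates IXPs and ISP labels.
    """
    adj = {}
    for a, b, isp, w in edges:
        adj.setdefault(a, []).append((b, isp, w))
        adj.setdefault(b, []).append((a, isp, w))
    pq = [(0.0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, isp, w in adj.get(node, []):
            if nxt not in seen:
                heappush(pq, (cost + w, nxt, path + [f"--{isp}-->", nxt]))
    return float("inf"), []
```

On a toy multigraph with two parallel IXP-A/IXP-B edges (via ISP1 at 10 ms and ISP2 at 5 ms), the search stitches the e2e path through the cheaper ISP2 segment; a QoS broker would additionally filter edges by the requested guarantees before running such a search.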
Investigation of defect reactions in silicon by means of charge-carrier lifetime spectroscopy
This work explains the fundamentals of temperature- and injection-dependent lifetime spectroscopy (T-IDLS), an extremely sensitive method for the material characterization of semiconductors. The method is then applied to model materials (silicon wafers) deliberately contaminated with iron, copper, and nickel. It is shown that T-IDLS not only allows the parametrization of the contamination-induced defects to be determined (energetic position in the band gap, ratio of the capture cross sections), but also makes it possible to detect chemical reactions of the individual defects in the model materials (through changes in defect concentration) and, beyond that, to evaluate them according to the laws of thermodynamics and kinetics. Finally, the results are applied to simulating the influence of the different defects on the efficiency of Passivated Emitter and Rear Cell (PERC) solar cells.
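The quantity such spectroscopy fits to measured data is the Shockley-Read-Hall (SRH) lifetime as a function of injection and temperature. A minimal sketch for p-type silicon follows; all defect parameters (level position, capture cross-section ratio, fundamental lifetime) are assumed example values, not results from the work above.

```python
import math

K_B = 8.617e-5   # eV/K, Boltzmann constant
N_I = 9.65e9     # cm^-3, intrinsic carrier density of Si (approx., 300 K)

def srh_lifetime(dn, T=300.0, N_A=1e16, E_t=0.30, k=10.0, tau_n0=1e-4):
    """Illustrative SRH lifetime (s) for p-type silicon.

    dn     : excess carrier density (cm^-3)
    N_A    : acceptor doping (cm^-3)
    E_t    : defect level relative to the intrinsic level E_i (eV), assumed
    k      : capture cross-section ratio sigma_n / sigma_p, assumed
    tau_n0 : fundamental electron lifetime (s), assumed
    Note: N_I is held at its 300 K value, a simplification for the sketch.
    """
    p0 = N_A                      # equilibrium majority holes from doping
    n0 = N_I ** 2 / p0            # equilibrium minority electrons
    n1 = N_I * math.exp(E_t / (K_B * T))    # SRH densities of the level
    p1 = N_I * math.exp(-E_t / (K_B * T))
    tau_p0 = k * tau_n0           # lifetime ratio follows the sigma ratio
    return (tau_p0 * (n0 + n1 + dn) + tau_n0 * (p0 + p1 + dn)) / (p0 + n0 + dn)
```

Fitting the measured shape of this curve over injection and temperature is what pins down the level position in the band gap and the capture cross-section ratio.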
Edge Replication Strategies for Wide-Area Distributed Processing
The rapid digitalization across industries comes with many challenges. One key problem is how the ever-growing and volatile data generated at distributed locations can be efficiently processed to inform decision making and improve products. Unfortunately, wide-area network capacity cannot cope with the growth of the data at the network edges. Thus, it is imperative to decide which data should be processed in-situ at the edge and which should be transferred and analyzed in data centers.
In this paper, we study two families of proactive online data replication strategies, namely ski-rental and machine learning algorithms, to decide which data is processed at the edge, close to where it is generated, and which is transferred to a data center. Our analysis using real query traces from a Global 2000 company shows that such online replication strategies can significantly reduce data transfer volume, in many cases by up to 50% compared to naive approaches, and achieve close-to-optimal performance. After analyzing their shortcomings with respect to ease of use and performance, we propose a hybrid strategy that combines the advantages of both competitive and machine learning algorithms.
Funding: EC/H2020/679158/EU/Resolving the Tussle in the Internet: Mapping, Architecture, and Policy Making/ResolutioNet; BMBF, 01IS18025A, Verbundprojekt BIFOLD-BBDC: Berlin Institute for the Foundations of Learning and Data; BMBF, 01IS18037A, Verbundprojekt BIFOLD-BZML: Berlin Institute for the Foundations of Learning and Data
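The competitive side of this trade-off follows the classic ski-rental rule: keep "renting" (transferring results for each query) until the accumulated transfer cost reaches the replication cost, then "buy" (replicate the data to the edge) once. The cost values below are illustrative, not from the paper's traces.

```python
def ski_rental_cost(num_queries, transfer_cost=1.0, replicate_cost=10.0):
    """Total cost of the break-even ski-rental strategy.

    Transfer per query until spending equals the replication cost,
    then replicate once; this deterministic rule is 2-competitive
    against an offline optimum that knows num_queries in advance.
    """
    rent_days = min(num_queries, int(replicate_cost / transfer_cost))
    cost = rent_days * transfer_cost
    if num_queries > rent_days:   # break-even reached: replicate once
        cost += replicate_cost
    return cost
```

With few queries the rule pays only the cheap transfers; with many it pays at most twice the offline optimum, which is the guarantee the hybrid strategy falls back on when the learned predictor is wrong.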
Network service chaining with optimized network function embedding supporting service decompositions